Bayesian Optimization with a Prior for the Optimum

Authors

Abstract

While Bayesian Optimization (BO) is a very popular method for optimizing expensive black-box functions, it fails to leverage the experience of domain experts. This causes BO to waste function evaluations on bad design choices (e.g., machine learning hyperparameters) that the expert already knows to work poorly. To address this issue, we introduce Bayesian Optimization with a Prior for the Optimum (BOPrO). BOPrO allows users to inject their knowledge into the optimization process in the form of priors about which parts of the input space will yield the best performance, rather than BO's standard priors over functions, which are much less intuitive for users. BOPrO then combines these priors with BO's standard probabilistic model to form a pseudo-posterior used to select which points to evaluate next. We show that BOPrO is around \(6.67\times \) faster than state-of-the-art methods on a common suite of benchmarks, and achieves new state-of-the-art performance on a real-world hardware design application. We also show that BOPrO converges faster even if the priors for the optimum are not entirely accurate, and that it robustly recovers from misleading priors.
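To make the idea in the abstract concrete, the sketch below shows one plausible way to combine a user-specified prior over good input regions with a model-based density fitted to past observations, and to pick the next evaluation point by maximizing the combined score. This is a minimal illustration of the general mechanism only; the weighting scheme, the kernel-density stand-in for BO's probabilistic model, and all function names (`user_prior_good`, `pseudo_posterior`, `suggest_next`) are assumptions for illustration, not the paper's exact BOPrO formulation.

```python
import numpy as np
from scipy.stats import norm

def user_prior_good(x):
    # Hypothetical expert belief: the optimum is likely near x = 0.2.
    return norm.pdf(x, loc=0.2, scale=0.1)

def model_density_good(x, observed_x, observed_y, gamma=0.3):
    # Density of the inputs that produced the best `gamma` fraction of
    # outcomes, estimated with a simple Gaussian kernel density
    # (a stand-in for a proper probabilistic model).
    cutoff = np.quantile(observed_y, gamma)
    good = observed_x[observed_y <= cutoff]  # minimization
    if len(good) == 0:
        return np.ones_like(x)
    bandwidth = 0.1
    return np.mean(norm.pdf((x[:, None] - good[None, :]) / bandwidth), axis=1) / bandwidth

def pseudo_posterior(x, observed_x, observed_y, t, beta=10.0):
    # The prior dominates early; the data-driven model gains weight as the
    # number of evaluations t grows (one plausible way to trade them off).
    return user_prior_good(x) * model_density_good(x, observed_x, observed_y) ** (t / beta)

def suggest_next(observed_x, observed_y, t, n_candidates=1000, rng=None):
    # Score random candidates under the pseudo-posterior and pick the best.
    rng = np.random.default_rng(0) if rng is None else rng
    candidates = rng.uniform(0.0, 1.0, size=n_candidates)
    scores = pseudo_posterior(candidates, observed_x, observed_y, t)
    return candidates[np.argmax(scores)]

# Usage: optimize a toy 1-D black-box function for a few iterations.
def f(x):
    return (x - 0.25) ** 2  # unknown to the optimizer

obs_x = np.array([0.05, 0.5, 0.9])
obs_y = f(obs_x)
for t in range(1, 6):
    x_next = suggest_next(obs_x, obs_y, t)
    obs_x = np.append(obs_x, x_next)
    obs_y = np.append(obs_y, f(x_next))
print("best x found:", obs_x[np.argmin(obs_y)])
```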


Similar articles

Bayesian Fuzzy Hypothesis Testing with Imprecise Prior Distribution

This paper considers the testing of fuzzy hypotheses on the basis of a Bayesian approach. For this, using a notion of prior distribution with interval or fuzzy-valued parameters, we extend a concept of posterior probability of a fuzzy hypothesis. Some of its properties are also investigated. The feasibility and effectiveness of the proposed methods are also cla...


Bayesian Optimum Design Criterion for Multi Models Discrimination

The problem of obtaining the optimum design that is able to discriminate between several rival models is considered in this paper. We give an optimality criterion using a Bayesian approach. This is an extension of Bayesian KL-optimality to more than two models. A modification is made to deal with nested models. The proposed Bayesian optimality criterion is a weighted average, where...


Bayesian Inference for Spiking Neuron Models with a Sparsity Prior

Generalized linear models are the most commonly used tools to describe the stimulus selectivity of sensory neurons. Here we present a Bayesian treatment of such models. Using the expectation propagation algorithm, we are able to approximate the full posterior distribution over all weights. In addition, we use a Laplacian prior to favor sparse solutions. Therefore, stimulus features that do not ...


Bayesian Optimization with Automatic Prior Selection for Data-Efficient Direct Policy Search

One of the most interesting features of Bayesian optimization for direct policy search is that it can leverage priors (e.g., from simulation or from previous tasks) to accelerate learning on a robot. In this paper, we are interested in situations for which several priors exist but we do not know in advance which one best fits the current situation. We tackle this problem by introducing a novel ...


A Bayesian Divergence Prior for Classifier Adaptation

OMB requires that influential scientific information on which a Federal Agency relies in a rule-making proceeding be subject to peer review to enhance the quality and credibility of the government’s scientific information. These studies constitute influential scientific information under OMB’s definition, 1 and thus these studies must be subject to peer review. OMB further requires Federal Agen...



Journal

Journal title: Lecture Notes in Computer Science

Year: 2021

ISSN: 1611-3349, 0302-9743

DOI: https://doi.org/10.1007/978-3-030-86523-8_17